
Conversation


@KirschX KirschX commented Oct 28, 2025

Background

#9780

In order to use finishReason with useChat on the frontend, users currently have to manually send it from the server—either by adding it to metadata or using createUIMessageStream.

I raised a question in the discussion tab, and it seems worthwhile to include finishReason by default so developers can access it without additional setup.

Summary

This PR adds the finishReason parameter to the onFinish callback in useChat, allowing developers to know why a stream finished (stop, length, content-filter, tool-calls, etc.).

Changes:

  • Added finishReason to ChatOnFinishCallback type
  • Updated UI message stream handling to propagate finishReason
  • Updated tests to verify finishReason is passed correctly
  • Added changeset for patch release
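For context, a minimal usage sketch of what this enables (not code from this PR, and the exact callback shape may differ): it assumes the onFinish options object now carries finishReason alongside the final message, with the package name and option names taken from the current useChat API.

```tsx
// Illustrative sketch only: assumes onFinish receives an options object that,
// after this change, includes `finishReason` next to the final message.
'use client';

import { useChat } from '@ai-sdk/react';

export function Chat() {
  const { messages, sendMessage } = useChat({
    onFinish: ({ message, finishReason }) => {
      // e.g. 'stop', 'length', 'content-filter', 'tool-calls', or undefined
      if (finishReason === 'length') {
        console.warn('Response was cut off by the token limit:', message.id);
      }
    },
  });

  // ...render `messages` and an input that calls `sendMessage(...)`
  return null;
}
```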

Manual Verification

Checklist

  • Tests have been added / updated (for bug fixes / features)
  • Documentation has been added / updated (for bug fixes / features)
  • A patch changeset for relevant packages has been added (for bug fixes / features - run pnpm changeset in the project root)
  • I have reviewed this pull request (self-review)

Future Work

Related Issues

closes #9780

@gr2m gr2m added the ai/ui label Oct 28, 2025

@gr2m gr2m left a comment


Thanks for the pull request.

For manual verification, I would use examples/next-openai and update one of the examples in there to somehow handle the finish reason, to make sure it works as expected for the most common cases.

@@ -1,3 +1,4 @@
import { FinishReason } from '../types/language-model';
gr2m (Collaborator) commented:

I think this can be removed?

KirschX (Contributor, Author) replied:

@gr2m
My mistake, removed it.

expect(letOnFinishArgs).toMatchInlineSnapshot(`
[
{
"finishReason": undefined,
gr2m (Collaborator) commented:

I haven't looked in detail yet, but is there a case where finishReason would be undefined? We might need to update our fixtures in the tests

@KirschX KirschX commented Oct 29, 2025

I think yes: finishReason can be undefined when there is no finish event (abort, disconnect, or server error). For normal completions it should be set (usually 'stop').

I've updated the fixtures: normal paths include finishReason: 'stop'; abnormal paths keep it undefined.
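A small sketch of that semantics as I read it (not code from the PR): the FinishReason union below is written out by hand and may not match the SDK's exported type exactly.

```ts
// Sketch only: an undefined finishReason means the stream ended without a
// finish event (abort, disconnect, or server error).
type FinishReason =
  | 'stop'
  | 'length'
  | 'content-filter'
  | 'tool-calls'
  | 'error'
  | 'other'
  | 'unknown';

function describeFinish(finishReason: FinishReason | undefined): string {
  if (finishReason === undefined) {
    // No finish event arrived before the stream ended.
    return 'stream ended without a finish event';
  }
  // Normal completion path, typically 'stop'.
  return `stream finished: ${finishReason}`;
}
```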

@KirschX KirschX requested a review from gr2m October 29, 2025 10:12
@KirschX KirschX force-pushed the feat/use-chat-onfinish-finish-reason branch from 3b8a2ad to fe0f666 on October 30, 2025 17:42

KirschX commented Oct 30, 2025

> Thanks for the pull request.
>
> For manual verification, I would use examples/next-openai and update one of the examples in there to somehow handle the finish reason, to make sure it works as expected for the most common cases.

I updated one of the example files in examples/next-openai (use-chat-data-ui-parts).

Sorry for the late update after requesting a review.
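For readers who have not opened the example, here is a rough sketch of the kind of change such a page could make to surface the finish reason (illustrative only, not the actual diff to use-chat-data-ui-parts; the state variable and test id are made up):

```tsx
// Illustrative only: stores the last finishReason from onFinish and renders it
// so it can be checked manually in the browser.
'use client';

import { useChat } from '@ai-sdk/react';
import { useState } from 'react';

export default function Page() {
  const [lastFinishReason, setLastFinishReason] = useState<string | undefined>();
  const { messages } = useChat({
    onFinish: ({ finishReason }) => setLastFinishReason(finishReason),
  });

  return (
    <div>
      {/* ...existing message rendering... */}
      <div>{messages.length} messages</div>
      <div data-testid="finish-reason">
        finishReason: {lastFinishReason ?? 'n/a'}
      </div>
    </div>
  );
}
```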


KirschX commented Nov 8, 2025

@gr2m
Also, I'm wondering whether it would be necessary to update chat.svelte.test.ts, chat.vue.ui.test.tsx, chat.ng.test.ts, and 01-use-chat.mdx as well.
(I found them updated in a related PR that was already accepted:
https://github.com/vercel/ai/pull/8364/files#diff-a66327f8723dc9beda251f5a7d76620ecc90e730806f63e570e49df26044ddc1)

I would appreciate your feedback whenever you have a moment.


gr2m commented Nov 10, 2025

> update chat.svelte.test.ts, chat.vue.ui.test.tsx, chat.ng.test.ts, and 01-use-chat.mdx as well

No need to update the UI framework tests. #8364 updated them because it had to.

But please update 01-use-chat.mdx.

@gr2m gr2m added the feature label Nov 10, 2025
@gr2m gr2m self-assigned this Nov 10, 2025

@gr2m gr2m left a comment


Thank you Jung, great pull request! Congratulations on landing your first contribution to the AI SDK! 💐

@gr2m gr2m enabled auto-merge (squash) November 10, 2025 15:48
@gr2m gr2m added the backport label Nov 10, 2025
@gr2m gr2m merged commit a322efa into vercel:main Nov 10, 2025
17 of 18 checks passed
vercel-ai-sdk bot pushed a commit that referenced this pull request Nov 10, 2025
@vercel-ai-sdk vercel-ai-sdk bot removed the backport label Nov 10, 2025

vercel-ai-sdk bot commented Nov 10, 2025

⚠️ Backport to release-v5.0 created but has conflicts: #10126

gr2m added a commit that referenced this pull request Nov 10, 2025
…0126)

This is an automated backport of #9857 to the release-v5.0 branch. FYI
@KirschX

---------

Co-authored-by: KirschX <[email protected]>
Co-authored-by: Gregor Martynus <[email protected]>